Network Capacity for Latent Attractor Computation
Authors
Abstract
Attractor networks have been one of the most successful paradigms in neural computation and have been used as models of computation in the nervous system. Many experimentally observed phenomena, such as coherent population codes, contextual representations, and replay of learned neural activity patterns, are explained well by attractor dynamics. Recently, we proposed a paradigm called latent attractors, where attractors embedded in a recurrent network via Hebbian learning are used to channel network response to external input rather than becoming manifest themselves. This allows the network to generate context-sensitive internal codes in complex situations. Latent attractors are particularly helpful in explaining computations within the hippocampus, a brain region of fundamental significance for memory and spatial learning. The performance of latent attractor networks depends on the number of such attractors that a network can sustain. Following methods developed for associative memory networks, we present analytical and computational results on the capacity of latent attractor networks.
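The latent attractor construction itself is not spelled out in this abstract, so the following minimal Hopfield-style sketch in Python (an assumption on our part, not the paper's model) only illustrates the underlying mechanism it builds on: embedding attractors in a recurrent network via the Hebbian outer-product rule and recovering them from corrupted cues. All names and parameter values are illustrative.

import numpy as np

# Minimal sketch: embedding attractors in a recurrent network via
# Hebbian learning (conventional associative-memory construction).
# The latent attractor variant, in which stored patterns channel the
# response to external input rather than being recalled directly,
# is not implemented here.

rng = np.random.default_rng(0)
N = 200   # number of units
P = 10    # number of stored patterns (attractors); classical Hebbian
          # recall degrades beyond roughly P = 0.138 * N patterns

# Random bipolar patterns to embed as attractors.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product rule; no self-connections.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    """Iterate synchronous updates until the network settles."""
    for _ in range(steps):
        new_state = np.sign(W @ state)
        new_state[new_state == 0] = 1
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

# Start from a corrupted version of pattern 0 and check convergence.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
overlap = recall(probe) @ patterns[0] / N
print(f"overlap with stored pattern: {overlap:.2f}")  # ~1.0 if recalled

The capacity question posed in the abstract is the analogue for the latent case: how many such embedded patterns the weights can hold before interference degrades the channeling of network responses.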
Similar Resources
A Gaussian Attractor Network for Memory and Recognition with Experience-Dependent Learning
Attractor networks are widely believed to underlie the memory systems of animals across different species. Existing models have succeeded in qualitatively modeling properties of attractor dynamics, but their computational abilities often suffer from poor representations for realistic complex patterns, spurious attractors, low storage capacity, and difficulty in identifying attractive fields of ...
Latent Attractors: A General Paradigm for Context-Dependent Neural Computation
Context is an essential part of all cognitive function. However, neural network models have only considered this issue in limited ways, focusing primarily on the conditioning of a system’s response by its recent history. This type of context, which we term Type I, is clearly relevant in many situations, but in other cases, the system’s response for an extended period must be conditioned by stim...
An attractor neural network architecture with an ultra high information capacity: numerical results
Attractor neural network is an important theoretical scenario for modeling memory function in the hippocampus and in the cortex. In these models, memories are stored in the plastic recurrent connections of neural populations in the form of “attractor states”. The maximal information capacity for conventional abstract attractor networks with unconstrained connections is 2 bits/synapse. However, ...
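As background on the figure quoted above (a standard result, not a claim of this paper): 2 bits/synapse is the Gardner bound for networks of N binary units with unconstrained real-valued couplings, which admit up to P_max = 2N random patterns. Each pattern carries N bits stored across the N^2 recurrent connections, so

\[
P_{\max} = 2N \quad\Longrightarrow\quad \frac{P_{\max}\,N \text{ bits}}{N^{2} \text{ synapses}} = 2 \text{ bits/synapse}.
\]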
Parallel Hopfield Networks
We introduce a novel type of neural network, termed the parallel Hopfield network, that can simultaneously effect the dynamics of many different, independent Hopfield networks in parallel in the same piece of neural hardware. Numerically we find that under certain conditions, each Hopfield subnetwork has a finite memory capacity approaching that of the equivalent isolated attractor network, whi...
Design of Continuous Attractor Networks with Monotonic Tuning Using a Symmetry Principle
Neurons that sustain elevated firing in the absence of stimuli have been found in many neural systems. In graded persistent activity, neurons can sustain firing at many levels, suggesting a widely found type of network dynamics in which networks can relax to any one of a continuum of stationary states. The reproduction of these findings in model networks of nonlinear neurons has turned out to b...